1960s
Anodynes & estrogens

Introduction
Mention the Sixties and there are varied “hot-button” responses—“JFK, LBJ, civil rights, and Vietnam” or “sex, drugs, and rock ’n’ roll.” But it was all of a piece. Politics and culture mixed like the colors of a badly tie-dyed t-shirt. In this narrative, however, drugs are the hallmark of the era—making them, taking them, and dealing with their good and bad consequences. Ever after this era, the world would be continually conscious of pills, pills, pills—for life, for leisure, and for love. In many ways, the Sixties was the Pharmaceutical Decade of the Pharmaceutical Century.

A plethora of new drugs was suddenly available: the Pill was first marketed; Valium and Librium debuted to soothe the nerves of housewives and businessmen; blood-pressure drugs and other heart-helping medications were developed. Another emblem of the 1960s was the development of worldwide drug abuse, including the popularization of psychotropic drugs such as LSD by “gurus” like Timothy Leary. The social expansion of drugs for use and abuse in the 1960s forever changed not only the nature of medicine but also the politics of nations.

The technology of drug discovery, analysis, and manufacture also proliferated. New forms of chromatography became available, including HPLC, capillary GC, GC/MS, and the rapid expansion of thin-layer chromatography techniques. Proton NMR was developed to analyze complex biomolecules. By the end of the decade, amino acid analyzers were commonplace, and the ultracentrifuge was fully adapted to biomedical uses. Analytical chemistry and biology joined as never before in the search for new drugs and analysis of old ones.

And of equal importance, a new progressivism took the stage, especially in the United States, where increasing demand for access to health care and protection from unsafe and fraudulent medications once more led to an increased federal presence in the process of drug development, manufacture, and sale.

Popping the Pill
If there was any single thing that foreshadowed the tenor of the decade, one medical advance was paramount: In 1960 the first oral contraceptive was mass-marketed. Sex and drugs were suddenly commingled in a single pill, awaiting only the ascendance of rock ’n’ roll to stamp the image of the decade forever. Born of the Pill, the sexual revolution seemed to be in good measure a pharmaceutical one.

The earlier achievement of women’s suffrage and the growing presence of women in the labor force had been partly stymied by the demands of pregnancy and child rearing in a male-dominated society. Feminist historians recount how hopes were suddenly energized by the ability of women to control their own bodies chemically and thus, socially. Chemical equality, at least for those who could afford it and, in the beginning, find the right doctors to prescribe it, was at last available.

However, it was not a uniformly smooth process. First and foremost, despite its popularity, the technology for tinkering with women’s reproductive systems had not been fully worked out. In Britain in 1961, problems arose with birth control pills containing excess estrogen. Similar problems also occurred in the United States. Dosage changes were required for many women; side effects debilitated a few.

But still the sexual revolution marched on, as was documented in the 1966 Masters and Johnson report Human Sexual Response, which showed a transformation of female sexuality, new freedoms, and new attitudes in both sexes. Furthermore, technology was not done fiddling with reproduction by any means. In 1969, the first test-tube fertilization was performed.

Valium of the dolls
In an era that would be far from sedate, the demand for sedatives was profound, and the drug marketplace responded rapidly. Although Miltown (meprobamate), the first of the major “tranks,” was called the Wonder Drug of 1954, sedatives weren’t widely used until 1961, when Librium (a benzodiazepine) was discovered and marketed as a treatment for tension. Librium proved a phenomenal success. Then Valium (diazepam), discovered in 1960, was marketed by Roche Laboratories in 1963 and rapidly became the most prescribed drug in history.

These drugs were touted to the general population and mass-marketed and prescribed by doctors with what many claimed was blithe abandon. While the youth of America were smoking joints and tripping on acid, their parents’ generation of businessmen and housewives were downing an unprecedented number of sedatives. According to the Canadian Government Commission of Inquiry into the Nonmedical Use of Drugs (1972), “In 1965 in the USA, some 58 million new prescriptions and 108 million refills were written for psychotropes (sedatives, tranquilizers, and stimulants), and these 166 million prescriptions accounted for 14% of the total prescriptions of all kinds written in the United States.” Physical and psychological addiction followed for many. Drug taking became the subject of books and movies. In the 1966 runaway best-seller Valley of the Dolls by Jacqueline Susann, the “dolls” were the pills popped by glamorous upper-class women in California. Eventually, the pills trapped them in a world of drug dependence that contributed to ruining their lives.

Drug wars
It was only a matter of time before the intensive testing of LSD by the military in the 1950s and early 1960s—as part of the CIA’s “Project MKULTRA”—spread into the consciousness of the civilian population.

By 1966, the chairman of the New Jersey Narcotic Drug Study Commission declared that LSD was “the greatest threat facing the country today… more dangerous than the Vietnam War.” In the United States, at least at the federal level, the battle against hallucinogens and marijuana use was as intense as, if not more intense than, the fight against narcotic drugs. 
BIOGRAPHY: FDA heroine 
In 1961, after developing concerns in her first review case, Frances Kelsey, M.D., held up FDA approval for a new sedative, thalidomide. Then, birth defect tragedies linked to thalidomide came to light in Canada and Europe. On July 15, 1962, the Washington Post ran a story headlined “Heroine of the FDA Keeps Bad Drug Off Market.” Kelsey was soon famed as the woman who had spared her country a nightmare, and President Kennedy awarded her the Distinguished Federal Civilian Service medal. On October 10, 1962, Kennedy signed the Kefauver–Harris Drug Amendments, which required that drug companies send reports of bad reactions to the FDA and that drug advertising mention harmful as well as beneficial effects.

According to some liberal critics, this was because these “recreational” drugs were a problem in the middle and the upper classes, whereas narcotic addiction was the purview of the poor.

From the start of the decade, in the United States and around the world, regions with large populations of urban poor were more concerned about the growing problem of narcotic drug addiction. In 1961, with the passage of the UN Single Convention on Narcotic Drugs, signatory nations agreed to processes for mandatory commitment of drug users to nursing homes. In 1967, New York State established a Narcotics Addiction Control Program that, following the UN Convention, empowered judges to commit addicts into compulsory treatment for up to five years. The program cost $400 million over just three years but was hailed by Governor Nelson Rockefeller as the “start of an unending war.” Such was the measure of the authorities’ concern with seemingly out-of-control drug abuse. The blatant narcotics addictions of many rock stars and other celebrities simultaneously horrified some Americans and glamorized the use of “hard” drugs among others, particularly young people.

By 1968, the Food and Drug Administration (FDA) Bureau of Drug Abuse Control and the Treasury Department’s Bureau of Narcotics were fused and transferred to the Department of Justice to form the Bureau of Narcotics and Dangerous Drugs in a direct attempt to consolidate the policing of traffic in illegal drugs. Also in 1968, Britain passed the Dangerous Drug Act to regulate opiates. As these efforts to stop drug use proliferated, technology for mass-producing many of these same drugs continued to improve in factories around the world. By 1968, Canada was producing nearly 56 million doses of amphetamines; by 1969, the United States was producing more than 800,000 pounds of barbiturates.

Forging a great society
The 1960 election of a Democratic administration in the United States created a new demand for government intervention in broader areas of society. John F. Kennedy and his successor, Lyndon B. Johnson, expanded federal intervention in a host of previously unregulated areas, both civil and economic, including food and medicine. 
SOCIETY: The “great” paternalism
Starting with the Kennedy administration in 1961, a new activist government in the United States posited the need for expanding federal intervention in a host of previously unregulated areas, both civil and economic. The necessity of added governmental regulation of food and the healthcare system seemed overwhelmingly apparent. The Kennedy administration’s healthcare plans and the tenets of Johnson’s “Great Society”—which was ultimately responsible for both Medicare and Medicaid—set the stage for increased federal funding for those unable to afford health care. These expanded dollars not only provided impetus for pharmaceutical research, they also created inflationary pressures on health care when coupled with the difficulties of simultaneously funding the Vietnam War.

Around the world, a new social agenda demanding rights for minorities (especially apparent in the civil rights struggles in the United States) and women (made possible by freedoms associated with the Pill) fostered a new focus on protecting individuals and ending, or at least ameliorating, some of the damage done to hitherto exploited social classes. Health and medicine were a prime example, giving rise to what many disparagingly called “government paternalism.” This same liberal agenda, in its darker moments, moved to protect less-developed nations from the perceived global communist threat—hence the Vietnam War. Ultimately, there was neither the money nor the will to finance both internal social and external political agendas. General inflation resulted, with specific increases in the cost of medical care.

Perhaps one of the most significant long-term changes in drug development procedures of the era, especially regarding the role of governments, came with a new desire to protect human “guinea pigs.” General advocacy for the poor, women, and minorities led to a reexamination of the role of paternalistic (and generally white male) clinicians in the morally repugnant treatment of human subjects throughout the century, both before and after the Nuremberg trials of Nazi doctors. In response to specific outrages, health groups worldwide established new regulations on informed consent and human experimentation, a profound catalyst for the modern bioethics movement.

In 1962, the Kefauver–Harris amendments to the Federal Food, Drug, and Cosmetic Act of 1938 were passed to expand the FDA’s control over the pharmaceutical and food industries. The Kefauver amendments were originally the outgrowth of Senate hearings begun in 1959 to examine the conduct of pharmaceutical companies. According to testimony during those hearings, it was common practice for these companies “to provide experimental drugs whose safety and efficacy had not been established to physicians who were then paid to collect data on their patients taking these drugs. Physicians throughout the country prescribed these drugs to patients without their control or consent as part of this loosely controlled research.” That the amendments were not passed until 1962 owed in part to the protracted battle against allowing additional government control of the industry. The 1958 Delaney proviso and the 1960 Color Additive Amendment led to industry and conservative resentment and complaints that the FDA was gaining too much power.

However, with the 1961 birth defects tragedy involving the popular European sedative thalidomide (prevented from being marketed in the United States by FDA researcher Frances Kelsey), public demand for greater protections against experimental agents was overwhelming. Thalidomide had been prescribed to treat morning sickness in countless pregnant women in Europe and Canada since 1957, but its connection to missing and deformed limbs in newborns whose mothers had used it was not realized until the early 1960s. In 1961, televised images of deformed “thalidomide babies” galvanized support for the FDA, demonstrating the power of this new medium to transform public opinion, as it had in the 1960 Nixon–Kennedy debates and would again in the Vietnam War years.

Building on the newfound public fears of synthetic substances, Rachel Carson’s 1962 book Silent Spring precipitated the populist environmental movement. As with the 1958 Delaney clause and the Federal Hazardous Substances Labeling Act of 1960, it was all part of increased public awareness of the potential dangers of the rapid proliferation of industrial chemicals and drugs. According to the 1962 amendments to the 1938 Food, Drug, and Cosmetic Act, new drugs had to be shown to be effective, prior to marketing, by means to be determined by the FDA. Ultimately, this requirement translated to defined clinical trials. However, Congress still vested excessive faith in the ethics of physicians by eliminating the need for consent when it was “not feasible” or was deemed not in the best interest of the patient; these decisions could be made “according to the best judgment of the doctors involved.”

Despite the fact that President Kennedy proclaimed a Consumer Bill of Rights—including the rights to safety and to be informed—more stringent federal guidelines to protect research subjects were not instituted until 1963. Then the NIH responded to two egregious cases: At Tulane University, a chimpanzee kidney was unsuccessfully transplanted into a human with the patient’s consent but without medical review. At the Brooklyn Jewish Chronic Disease Hospital, in association with the Sloan-Kettering Cancer Research Institute, live cancer cells were injected into indigent, cancer-free elderly patients.

Particularly important to the public debate was the disclosure of 22 examples of potentially serious ethical violations found in research published in recent medical journals. The information was presented by Henry Beecher of Harvard Medical School to science journalists at a 1965 conference sponsored by the Upjohn pharmaceutical company and was later published in the New England Journal of Medicine.

In 1964, the World Medical Association issued its Declaration of Helsinki, which set standards for clinical research and demanded that subjects be given informed consent before enrolling in an experiment. By 1966, in the United States, the requirement for informed consent and peer review of proposed research was written into the guidelines for Public Health Service–sponsored research on human subjects. These regulations and the debate surrounding human experimentation continued to evolve and be applied more broadly.

And in an attempt to protect consumers from unintentional waste or fraud, in 1966, wielding its new power from the Kefauver amendments, the FDA contracted with the National Academy of Sciences/National Research Council to evaluate the effectiveness of 4000 drugs approved on the basis of safety alone between 1938 and 1962.

The U.S. government faced other health issues with less enthusiasm. In 1964, the Surgeon General’s Report on Smoking was issued, inspired in part by the demand for a response to the Royal College of Physicians’ 1962 report in Britain. Although the Surgeon General’s report led to increased public awareness and mandated warning labels on tobacco products, significant government tobacco subsidies, tax revenues, smoking, and smoking-related deaths continued in the United States. The debate and associated litigation go on to this day.

Heart heroics and drugs
The lungs were not the only vital organs to attract research and publicity in the Sixties. From heart transplants and blood-pressure medications to blood-based drugs, a new kind of “heroic” medicine focused technology on the bloodstream and its potential for helping or hindering health.

Heart surgery captivated the public with its bravado. In 1960, the transistorized self-contained pacemaker was introduced. In the 1960s, organ transplantation became routinely possible with the development of the first immune suppressant drugs. In 1967, the coronary bypass operation was developed by Michael DeBakey; in that same year, physician (and some would say showman) Christiaan Barnard performed the first heart transplant. In 1968, angioplasty was introduced for arterial treatment and diagnosis. 
TECHNOLOGY: Cracking the code
For all the hoopla over decoding the structure of DNA in the 1950s, it was still a molecule without a mission. How did genes make proteins? How were they regulated? Although the role of mRNA was elucidated in the late 1950s, it wasn’t until 1961 that Marshall Nirenberg and Heinrich Matthaei at the NIH used an artificially created poly-U mRNA in a cell-free system to synthesize the novel protein poly(phenylalanine). By 1965, mix-and-match experiments with other artificial mRNAs established that the code was read in triplet codons (degenerate, with most amino acids specified by more than one codon) and identified the triplet assignments that make up the genetic code. 
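
The mapping itself is simple enough to sketch. The fragment below uses a toy codon table (only a few of the 64 assignments are shown, purely for illustration) to read an mRNA string three bases at a time, which is why a poly-U message can yield only poly(phenylalanine).

```python
# Minimal sketch: translating an mRNA string with a triplet codon table.
# Only a handful of the 64 codons are listed; UUU -> Phe is the assignment
# Nirenberg and Matthaei established with their poly-U message.
CODON_TABLE = {
    "UUU": "Phe", "UUC": "Phe",              # degenerate: two codons, one amino acid
    "AAA": "Lys", "AAG": "Lys",
    "GGU": "Gly", "GGC": "Gly", "GGA": "Gly", "GGG": "Gly",
    "UAA": None, "UAG": None, "UGA": None,   # stop codons
}

def translate(mrna: str) -> list[str]:
    """Read the message three bases at a time and look up each triplet."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE.get(mrna[i:i + 3])
        if residue is None:   # stop codon, or a codon missing from this toy table
            break
        peptide.append(residue)
    return peptide

# A poly-U message reads as UUU, UUU, UUU, ... and yields poly(phenylalanine).
print(translate("UUU" * 5))   # ['Phe', 'Phe', 'Phe', 'Phe', 'Phe']
```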

By 1964, Pamela Abel and T. A. Trautner demonstrated the universality of the genetic code, from bacteria to the “higher” forms of life. The genetic code’s elegance and practical potential inspired Linus Pauling to propose molecular fingerprinting using protein and DNA sequences as a means to identify organisms. By 1969, Pauling’s idea was bearing clinical fruit: Enterobacteria isolates proved classifiable using comparative DNA hybridization. This development was not only a major breakthrough in species analysis, but the prelude to a universe of molecular diagnostics.

But for all the glamour of surgery, chemicals for controlling heart disease provided far more widespread effects. Blood-pressure drugs based on new knowledge of human hormone systems became available for the first time. In 1960, guanethidine (a noradrenaline release inhibitor) was developed for high blood pressure; in rapid succession the first beta-adrenergic blocker appeared in Britain (1962), and alpha-methyldopa, discovered in 1954, was first used clinically in the early 1960s for treating high blood pressure by interfering not with noradrenaline release, but with its synthesis. In 1964, methods were perfected to nourish individuals through the bloodstream. This ability to feed intravenously and provide total caloric intake for debilitated patients forever changed the nature of coma and the ethics of dealing with the dying. The use of blood-thinning and clot-dissolving compounds for heart disease was also pioneered in the early 1960s: Streptokinase and aspirin reduced deaths by 40% when taken within a few hours of a heart attack.

In the late Sixties, as methods of fractionation using centrifugation and filtration improved dramatically, concentrated blood factors became readily available for the first time. Plasmapheresis—centrifuging the red blood cells from plasma and then returning the whole cells to the donor—allowed donations to occur twice weekly instead of every few months. Blood proteins created a booming industry in plasma products: “Big-D” anti-Rh antibodies given to mothers immediately after the birth of their first Rh-positive child, albumin, gamma globulins, blood-typing sera, and various clotting factors such as factor VIII. But the boom also made blood all the more valuable as a commodity.

Abuses increased in collecting, distributing, and manufacturing blood products, as recounted by Douglas Starr in his 1999 book Blood. In 1968, in a move that foreshadowed the AIDS era yet to come, the United States revoked all licenses for consumer sales of whole plasma prepared from multiple donors because of fears that viral hepatitis would spread. Fears were exacerbated by scandalous revelations about the health status (or lack thereof) of many paid blood donors. This donor problem became even more newsworthy in the early 1970s as malnourished street hippies, drug abusers, unhealthy indigents, and prisoners were revealed as sources often contaminating the world blood supply.

High tech/new mech
In the 1960s, analytical chemists increasingly turned their attention to drug discovery and analysis and to fundamental questions of biological significance in physiology and genetics. New technologies were developed, and previously designed instruments were adapted to biomedical applications.

The development of high-pressure (later known as high-performance) liquid chromatography (HPLC) heralded a new era of biotechnology and allowed advanced separations of fragile macromolecules in fractions of the time previously required. The radioimmunoassay, first developed in 1959 by Rosalyn Yalow and Solomon Berson, was perfected in 1960. Tissue culture advances proliferated, allowing more and better in vitro testing of drugs and, when coupled with radiotracer and radioimmunoassay experiments, leading to unprecedented breakthroughs in all areas of mammalian physiology. In 1964, for example, Keith Porter and Thomas F. Roth discovered the first cell membrane receptor. Further developments in analytical chemistry came as gas chromatography (GC) was first linked with mass spectrometry (MS), providing a quantum leap in the ability to perform structural analysis of molecules. And laboratory automation, including primitive robotics, became a powerful trend.

But perhaps the most important breakthrough in tissue culture, and one that created a direct path to the Human Genome Project, was the invention of somatic-cell hybridization by Mary Weiss and Howard Green in 1965. By fusing mouse and human cells together via the molecular “glue” of Sendai virus, these researchers and others quickly developed a series of cell lines containing mostly mouse chromosomes but with different single human ones, all expressing unique proteins. For the first time, human proteins could be assigned to individual human chromosomes (and later chromosome fragments) to the degree that gene mapping was finally possible in humans. Of comparable importance, new improvements in fermentation technology allowed continuous cycling and easy sterilization along with mass-produced instrumentation.
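
The logic of mapping with such hybrid panels can be sketched briefly: given which human chromosomes each hybrid line retains and which lines express the protein of interest, the gene is assigned to the chromosome whose presence exactly tracks expression. The panel below is invented for illustration; the line names, retained chromosomes, and expression calls are hypothetical.

```python
# Toy illustration of gene mapping with a somatic-cell hybrid panel (data invented).
# Each hybrid line keeps its mouse chromosomes plus a different subset of human ones;
# a human protein is assigned to the chromosome whose retention pattern matches
# the protein's expression pattern across the panel.
panel = {
    "hybrid_A": {"retained": {1, 2, 17}, "expresses_protein": True},
    "hybrid_B": {"retained": {2, 5},     "expresses_protein": False},
    "hybrid_C": {"retained": {17, 21},   "expresses_protein": True},
    "hybrid_D": {"retained": {5, 21},    "expresses_protein": False},
}

def assign_chromosome(panel):
    """Return the human chromosomes whose presence exactly tracks protein expression."""
    all_chromosomes = set().union(*(line["retained"] for line in panel.values()))
    candidates = []
    for chrom in sorted(all_chromosomes):
        if all((chrom in line["retained"]) == line["expresses_protein"]
               for line in panel.values()):
            candidates.append(chrom)
    return candidates

print(assign_chromosome(panel))   # [17] -> the protein's gene maps to chromosome 17
```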

Fundamental breakthroughs in the etiology of disease transformed the understanding of infection. In 1961, the varying polio virus receptors were correlated to pathogenicity of known isolates; in 1967, diphtheria toxin’s mode of action was finally determined and provided the first molecular definition of a bacterial protein virulence factor.

Structural biology proceeded at an enthusiastic pace. In 1960, John Kendrew reported the first high-resolution X-ray analysis of the three-dimensional structure of a protein—sperm whale myoglobin. In the 1960s, image analyzers were linked to television screens for the first time, enhancing the use and interpretation of complex images. And in 1967, Max Perutz and Hilary Muirhead built a high-resolution model of the atomic structure of oxyhemoglobin, which promoted a wave of protein structural analysis. Computer systems became increasingly powerful and quickly indispensable to all laboratory processes, but especially to molecular analysis.

Hardly falling under the category of miscellaneous discoveries, at least in terms of the development of biotechnology and the pharmaceutical industry, was the development of agarose gel electrophoresis in 1961. This technique was critical for separating and purifying high-molecular-weight compounds, especially DNA. In 1963, to the benefit of a host of laboratory workers, the first film badge dosimeter was introduced in the United Kingdom; and in 1966, the standardized disc-diffusion test for evaluating antibiotics was developed, a boon to the exploding pharmaceutical development of such compounds.

Dancing with DNA
Finally, the understanding of the structure of DNA and acceptance that it was indeed genetic material yielded intelligible and practical results.

At the beginning of the decade, researchers A. Tsugita and Heinz Fraenkel-Conrat demonstrated the link between mutation and a change in the protein produced by the gene. Also in 1960, Francois Jacob and Jacques Monod proposed their operon model. This was the birth of gene regulation models, which launched a continuing quest for gene promoters and triggering agents, such that by 1967, Walter Gilbert and co-workers identified the first gene control (repressor) substance.

Perhaps most significantly, the prelude to biotechnology was established when restriction enzymes were discovered, the first cell-free DNA synthesis was accomplished by biochemist Arthur Kornberg in 1961, and the DNA–amino acid code was deciphered.

Deciphering the genetic code was no minor feat. Finding the mechanism for going from gene to protein was a combination of brilliant theorizing and technological expertise.

The technology necessary for the dawn of biotechnology proliferated as well. Throughout the 1960s, automated systems for peptide and nucleic acid analysis became commonplace. In 1964, Bruce Merrifield invented a simplified technique for protein and nucleic acid synthesis, which was the basis for the first such machines. (In the 1980s, this technique would be mass-automated for gene synthesis.) In 1967, the first specific gene transfer was accomplished; the lac operon was functionally transferred from E. coli to another bacterial species. And, although its importance was not realized at the time, in 1967 the thermophilic bacterium Thermus aquaticus was discovered in a hot spring in Yellowstone National Park; it later proved to be the ultimate source of heat-stable Taq polymerase, the enabling enzyme for modern PCR.
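
Why heat stability mattered became fully apparent only with PCR itself in the 1980s: each amplification cycle denatures the template near 95 °C and, under idealized assumptions, roughly doubles the copy number, so an enzyme that survives dozens of heating steps need not be replenished after every cycle. The sketch below uses idealized numbers purely for illustration.

```python
# Back-of-envelope sketch (idealized numbers): each PCR cycle denatures the
# template near 95 degrees C and, at best, doubles the number of copies.
# A heat-labile polymerase would have to be added again after every
# denaturation step; heat-stable Taq polymerase survives them all.
def copies_after(cycles: int, start: int = 1, efficiency: float = 1.0) -> float:
    """Idealized amplification: start * (1 + efficiency) ** cycles."""
    return start * (1.0 + efficiency) ** cycles

for n in (10, 20, 30):
    print(f"{n} cycles: ~{copies_after(n):,.0f} copies from a single template")
# 30 cycles: ~1,073,741,824 copies -- and 30 heating steps that a fragile
# enzyme would not survive.
```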

By the late 1960s, not only had the first complete gene been isolated from an organism, but biologists were already debating the ethics of human and animal genetic engineering.

Generation of drug-takers
In the 1960s, the post–World War II generation of baby boomers entered their teenage years. These were the children of vitamins, antibiotics, hormones, vaccines, and fortified foods such as Wonder Bread and Tang. The technology that won the war had also transformed the peace and raised high expectations on all social fronts, including, and perhaps especially, health. The unparalleled prosperity of the 1950s was largely driven by new technologies and a profusion of consumer goods. This prosperity, coupled with a new political progressivism, created a euphoric sense of possibility early in the decade, especially with regard to health care. The prevailing belief was that medicine would save and society would provide. Although the dream ultimately proved elusive, its promise permeated the following decades—Medicare, Medicaid, and a host of new regulations were the lingering aftereffects in government, a generation of drug-takers the result in society at large.

The Sabin oral polio vaccine was approved in the United States in 1960 after trials involving 100 million people overseas and promised a new generation salvation from this former scourge. In 1961, the sweetener cyclamate was introduced in the first low-calorie soft drink, Diet Rite, and it created demand for consumption without cost, or at least without weight gain. In 1964, a suitable routine measles vaccine was developed, a marked improvement over its predecessor first produced in 1960. In 1967, a live-virus mumps vaccine was developed. Faith that childhood diseases could be stamped out grew rapidly. The Surgeon General of the United States even went so far as to state that we were coming near—for the first time in history—to finally “closing the books on infectious diseases.”

From today’s perspective, these seem like naive hopes. But they were not without some foundation. Antibiotics had yet to lose effectiveness, and more were being discovered or synthesized. In 1966, the first antiviral drug, amantadine-HCl, was licensed in the United States for use against influenza. In 1967, the World Health Organization (WHO) began efforts to eradicate smallpox. The rhetoric of nongovernmental antipolio and antituberculosis groups provided additional reason for optimism. 
Suggested reading
  • Pills-A-Go-Go: A Fiendish Investigation into Pill Marketing, Art, History & Consumption, Hogshire, J. (Feral House: Venice, CA, 1999) 
  • The Greatest Benefit to Mankind: A Medical History of Humanity, Porter, R. (W. W. Norton: New York, 1997) 
  • Drugs and Narcotics in History, Porter, R.; Teich, M., Eds. (Cambridge University Press: Cambridge, U.K., 1997) 
  • Readings in American Health Care: Current Issues in Socio-Historical Perspective, Rothstein, W. G., Ed. (University of Wisconsin Press: Madison, WI, 1995) 
  • Blood: An Epic History of Medicine and Commerce, Starr, D. (Alfred A. Knopf: New York, 1999) 
  • The Social Transformation of American Medicine, Starr, P. (Basic Books: New York, 1982) 
  • FDA History on the Web: www.fda.gov/oc/history/default.htm 

No wonder a generation of baby boomers was poised to demand a pill or vaccine for any and every ill that affected humankind—pharmaceutical protection from unwanted pregnancies, from mental upset, from disease. Rising expectations were reflected in the science and business arenas, where research was promoted and industry driven to attack a greater variety of human ills with the weapons of science. Technological fixes in the form of pills and chemicals seemed inevitable. How else could the idea of a war on cancer in the next decade be initiated with the same earnestness and optimism as the quest for a man on the moon? The 1969 success of Apollo 11 was the paradigm for the capabilities of technology.

On a darker note, the decade ended with a glimpse of the more frightening aspects of what biomedical technology could do, or at least what militarists wanted it to do. In 1969, the U.S. Department of Defense requested $10 million from Congress to develop a synthetic biological agent to which no natural immunity existed. Funding soon followed under the supervision of the CIA at Fort Detrick, MD.

Ultimately, although a new battery of technologies had become available in the 1960s, including HPLC, GC/MS, and machines to synthesize DNA and proteins, and new knowledge bases were developed—from cracking the genetic code to the discovery of restriction enzymes—the majority of these breakthroughs would not bear fruit until the 1970s and 1980s. The 1960s would instead be remembered primarily for the changes wrought by new pharmaceuticals and new social paradigms. As the decade ended, it was an open question whether the coming decade would bring the dawn of massive biological warfare or the long-expected nuclear holocaust (heightened in the world psyche by the Cuban Missile Crisis of 1962). Others speculated that the future would bring massive social collapse arising from the intergenerational breakdown in the West that many blamed on the success of pharmaceutical technology in developing and producing new and dangerous drugs.


© 2000 American Chemical Society
